Randomized Subspace Newton Convex Method Applied to Data-Driven Sensor Selection Problem


Abstract

Randomized subspace Newton convex methods for the sensor selection problem are proposed. The randomized subspace Newton algorithm is straightforwardly applied to the convex formulation, and a customized method, in which part of the update variables are selected to be the present best sensor candidates, is also considered. At the converged solution, almost the same results are obtained by the original and randomized-subspace-Newton convex methods. As expected, the randomized-subspace-Newton methods require more computational steps, but they reduce the total computational time because the cost of one step is significantly reduced, by the cube of the ratio of the number of randomly updated variables to the number of all variables. The customized method shows performance superior to the straightforward implementation in terms of both the quality of the selected sensors and the computational time.
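The core update can be sketched as follows. This is a generic coordinate-sampled subspace Newton step on a strongly convex quadratic test problem, not the paper's sensor-selection objective; the problem data `A`, `b` and the subset size `k` are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Strongly convex quadratic test problem: f(x) = 0.5 x^T A x - b^T x.
# (Illustrative data, not the sensor-selection objective of the paper.)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite Hessian
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)     # exact minimizer, for reference

def randomized_subspace_newton(x0, k=10, iters=1500):
    """Each step takes an exact Newton step restricted to a random
    subset of k coordinates, so one step costs O(k^3) instead of O(n^3)."""
    x = x0.copy()
    for _ in range(iters):
        S = rng.choice(n, size=k, replace=False)   # random subspace
        g_S = (A @ x - b)[S]                       # gradient on the subset
        H_SS = A[np.ix_(S, S)]                     # Hessian block
        x[S] -= np.linalg.solve(H_SS, g_S)         # subspace Newton step
    return x

x = randomized_subspace_newton(np.zeros(n))
print(np.linalg.norm(x - x_star))
```

Each step solves only a k-by-k linear system, which is the source of the cubic per-step speed-up the abstract describes; the trade-off is that more steps are needed overall.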


Similar articles

Distributed Sensor Selection using a Truncated Newton Method

We propose a new distributed algorithm for computing a truncated Newton method, where the main diagonal of the Hessian is computed using belief propagation. As a case study for this approach, we examine the sensor selection problem, a Boolean convex optimization problem. We form two distributed algorithms. The first algorithm is a distributed version of the interior point method by Joshi and Bo...
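The truncated-Newton idea itself can be sketched generically: the Newton system is solved only approximately, with a few conjugate-gradient iterations, rather than exactly. This is a centralized toy version on a quadratic, not the distributed belief-propagation algorithm of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Quadratic test problem: f(x) = 0.5 x^T A x - b^T x (illustrative data).
n = 30
M = rng.standard_normal((n, n))
A = M @ M.T / n + np.eye(n)        # well-conditioned positive definite

b = rng.standard_normal(n)

def cg_solve(H, g, max_iter=5):
    """A few conjugate-gradient iterations on H d = g:
    the 'truncated' inner solve of a truncated Newton method."""
    d = np.zeros_like(g)
    r = g.copy()                    # residual at d = 0
    p = r.copy()
    for _ in range(max_iter):
        Hp = H @ p
        alpha = (r @ r) / (p @ Hp)
        d += alpha * p
        r_new = r - alpha * Hp
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return d

def truncated_newton(x0, outer=20, inner=5):
    x = x0.copy()
    for _ in range(outer):
        g = A @ x - b               # gradient of the quadratic
        x -= cg_solve(A, g, max_iter=inner)
    return x

x = truncated_newton(np.zeros(n))
print(np.linalg.norm(A @ x - b))
```

The inner CG loop needs only Hessian-vector products, which is what makes the approach attractive in distributed settings where the full Hessian is never assembled.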


A Convex Relaxation Approach to the Affine Subspace Clustering Problem

Abstract. Prototypical data clustering is known to suffer from poor initializations. Recently, a semidefinite relaxation has been proposed to overcome this issue and to enable the use of convex programming instead of ad-hoc procedures. Unfortunately, this relaxation does not extend to the more involved case where clusters are defined by parametric models, and where the computation of means has t...


Truncated regularized Newton method for convex minimizations

Recently, Li et al. (Comput. Optim. Appl. 26:131–147, 2004) proposed a regularized Newton method for convex minimization problems. The method retains local quadratic convergence property without requirement of the singularity of the Hessian. In this paper, we develop a truncated regularized Newton method and show its global convergence. We also establish a local quadratic convergence theorem fo...


Regularized Newton method for unconstrained convex optimization

We introduce the regularized Newton method (rnm) for unconstrained convex optimization. For any convex function, with a bounded optimal set, the rnm generates a sequence that converges to the optimal set from any starting point. Moreover the rnm requires neither strong convexity nor smoothness properties in the entire space. If the function is strongly convex and smooth enough in the neighborho...
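A minimal sketch of the regularization idea (a generic Levenberg-Marquardt-style shift, not necessarily the paper's exact rnm rule): adding mu*I to the Hessian keeps the Newton system solvable even where the Hessian is singular, as at the minimizer of f(x) = sum(x_i^4)/4:

```python
import numpy as np

# f(x) = sum(x_i^4)/4 is convex, but its Hessian diag(3 x_i^2)
# is singular at the minimizer x = 0, so a plain Newton step
# breaks down there; the regularized step does not.

def grad(x):
    return x**3

def hess(x):
    return np.diag(3.0 * x**2)

def regularized_newton(x0, iters=60):
    x = x0.copy()
    n = x.size
    for _ in range(iters):
        g = grad(x)
        mu = np.linalg.norm(g)      # shrink the shift as we converge
        # Regularized Newton system: (H + mu I) d = g.
        x = x - np.linalg.solve(hess(x) + mu * np.eye(n), g)
    return x

x = regularized_newton(np.ones(5))
print(np.linalg.norm(x))
```

Tying the shift mu to the gradient norm is one common choice: the regularization vanishes near the solution, so fast local convergence can be retained.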


A Newton-Like Method for Convex Functions

A Newton-like method for convex functions is derived. It is shown that this method can be better than the Newton method. Especially good results can be obtained if we combine these two methods. Illustrative numerical examples are given. Mathematics Subject Classification: 65H05



Journal

Journal title: IEEE Signal Processing Letters

Year: 2021

ISSN: 1558-2361, 1070-9908

DOI: https://doi.org/10.1109/lsp.2021.3050708